Sublinear Time Orthogonal Tensor Decomposition
Existing sketching-based algorithms for tensor decomposition compute sketches of the input tensor, which requires reading the entire input. We show that in a number of cases one can achieve the same theoretical guarantees in sublinear time, i.e., without reading most of the input tensor. Instead of using sketches to estimate inner products in tensor decomposition algorithms, we use importance sampling. To achieve sublinear time, we need to know the norms of tensor slices, and we show how to estimate these in a number of important cases. For symmetric tensors $T = \sum_{i=1}^k \lambda_i u_i^{\otimes p}$ with $\lambda_i > 0$ for all $i$, we estimate such norms in sublinear time whenever $p$ is even. For the important case of $p = 3$ and small values of $k$, we can also estimate such norms. For asymmetric tensors, sublinear time is not possible in general, but we show that if the tensor slice norms are just slightly below $\|T\|_F$ then sublinear time is again possible. One of the main strengths of our work is empirical: in a number of cases our algorithm is orders of magnitude faster than existing methods with the same accuracy.
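To make the abstract's central idea concrete, here is a minimal numpy sketch of importance sampling for an inner product, the primitive the paper substitutes for sketching. This is an illustrative length-squared sampling estimator, not the paper's actual slice-norm-based scheme; the function name and interface are our own.

```python
import numpy as np

def importance_sample_inner(x, y, m, rng):
    """Unbiased estimate of <x, y> from m sampled coordinates.

    Coordinates are drawn with probability p_i proportional to x_i^2
    (length-squared sampling), so each term x_i * y_i / p_i has
    expectation exactly <x, y>; averaging m samples reduces variance.
    Only m coordinates of y are ever read.
    """
    p = x ** 2 / np.dot(x, x)            # sampling distribution over coordinates
    idx = rng.choice(len(x), size=m, p=p)
    return np.mean(x[idx] * y[idx] / p[idx])
```

The key point mirrored from the abstract: drawing samples requires knowing a norm (here $\|x\|_2$, in the paper the tensor slice norms), which is why estimating slice norms in sublinear time is the technical crux.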
Reviews: Sublinear Time Orthogonal Tensor Decomposition
This paper presents a randomized method for decomposing a symmetric, orthogonal tensor that is "nearly low rank". The fact that the tensor is symmetric and composed of orthogonal components may sound restrictive, but this is exactly the sort of tensor decomposition problem that comes up in spectral methods for solving Latent Dirichlet Allocation problems, so it's an interesting model to study. The method introduced very closely follows the NIPS 2015 paper "Fast and guaranteed tensor decomposition via sketching", which uses a standard tensor power method for decomposition but speeds up each iteration by accelerating the required tensor contractions (dot products of vectors against the tensor) via randomized sketching techniques. This paper uses the same algorithm, but instead of sketches based on random projections, which take random linear combinations of all of the entries of the tensor, it uses a subsampling technique to approximate the contractions. The authors' main selling point is that this approach should allow for better running times, since they don't have to touch every entry of the tensor in each iteration.
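For readers unfamiliar with the tensor power method the review refers to, the following is a minimal dense numpy sketch of one component recovery for a symmetric 3-tensor. The einsum contraction is the expensive step that both the sketching paper and this paper's sampling approach replace with a fast approximation; function names and the restart-free structure here are our own simplifications.

```python
import numpy as np

def power_iteration_step(T, u):
    """One tensor power step: contract T with u in two modes, renormalize."""
    v = np.einsum('ijk,j,k->i', T, u, u)   # T(I, u, u): the costly contraction
    return v / np.linalg.norm(v)

def recover_top_component(T, iters=30, rng=None):
    """Recover one component of T = sum_i lambda_i u_i^{(x)3} with
    orthonormal u_i, by repeated power steps from a random start.

    Illustrative dense version: each step reads the whole tensor,
    which is exactly what the sketching/sampling variants avoid.
    """
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(iters):
        u = power_iteration_step(T, u)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # recovered eigenvalue
    return lam, u
```

In the orthogonal setting the iteration converges quadratically to one of the components $u_i$ (which one depends on the random start), after which the component is deflated from the tensor and the process repeats.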
Zhao Song, David Woodruff, Huan Zhang